Abstract:
Developing airborne software under the current prescriptive certification standards imposes restrictions on, among other things, emerging technologies. Enabling new trends in software engineering, reducing cost, fostering innovation, and increasing safety is the holy grail for airborne software developers. An Overarching Properties-based process is a technology-independent approach that defines three quality properties a product must exhibit to achieve certification approval. It has yet to be adequately investigated and officially regulated. This work explores qualifying a software tool with Overarching Properties based on the results of System-Theoretic Process Analysis. The objective is to provide confidence in the tool's utilization by systematically analyzing risk factors and proving that the developer or the user has mitigated them. The argument is structured using the Overarching Properties. We use experience from qualifying software tools according to DO-330/ED-215 as a benchmark to evaluate our approach in terms of cost, efficiency, and reusability. In addition, this experiment exposes problems that may hinder realizing a flexible new development/approval framework.
Abstract:
Ultrasonic imaging systems usually require an array of ultrasonic transducers to acquire data over a wide area on top of the object under investigation. The goal of an imaging algorithm is to use the reflected ultrasound data to form a recognizable image. Conventional algorithms such as SAFT are based on an inverse Huygens' principle and therefore need a dense measurement grid, which makes data capture laborious. For simple and inexpensive measurement, different strategies for imaging with a reduced amount of data are presented, along with examples using a manual scanning device on concrete elements.
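The SAFT reconstruction this abstract refers to is, at its core, a delay-and-sum over the measurement grid. A minimal NumPy sketch (not the authors' implementation) illustrates the principle, assuming single-element A-scans recorded at known scan positions and a constant sound velocity:

```python
import numpy as np

def saft_reconstruct(ascans, positions, xs, zs, c, fs):
    """Delay-and-sum SAFT sketch: for each image pixel, accumulate the
    A-scan sample whose round-trip travel time matches the distance
    from the transducer position to that pixel."""
    image = np.zeros((len(zs), len(xs)))
    n_samples = ascans.shape[1]
    for i, xt in enumerate(positions):              # one A-scan per scan position
        for ix, x in enumerate(xs):
            for iz, z in enumerate(zs):
                t = 2.0 * np.hypot(x - xt, z) / c   # round-trip time to pixel
                k = int(round(t * fs))              # matching sample index
                if k < n_samples:
                    image[iz, ix] += ascans[i, k]
    return image
```

Because every pixel sums a contribution from every scan position, thinning the measurement grid directly weakens the focusing, which is why reduced-data imaging strategies like those in the abstract require a different formulation rather than simply dropping positions.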
Abstract:
Compliance with airworthiness regulations is of paramount importance in the aviation industry to ensure a smooth certification process and the safety of passengers and crew. However, the traditional approach to software development and certification for safety-critical systems poses challenges due to the emergence of unmanned aircraft systems, urban air mobility, and machine learning-based software. In this paper, we propose four essential criteria for designing an approach that demonstrates software compliance with current airworthiness regulations. These criteria include aircraft-focused analysis, multi-level safety considerations, qualitative evidence of compliance, and an integrated supportive framework. By adhering to these criteria, developers can tailor their compliance strategies to their unique circumstances while maintaining safety standards. Furthermore, we explore the integration of these criteria with the Overarching Properties (OPs) to provide a structured and systematic approach for proving compliance at every stage of the software’s life cycle.
Abstract:
The Information and Communication Technology (ICT) subject is offered in secondary schools in the Malaysian public school system. The subject is taught by teachers trained in ICT at a number of public universities, which conduct their programs according to their individual curricula as there is no common standard. This research developed a standard for Pedagogical Content Knowledge (PCK) for ICT teachers in Malaysia. A total of 245 expert ICT teachers were involved in the research, carried out from October to November 2011. Three main domains were considered: (a) knowledge about students, (b) curriculum knowledge, and (c) knowledge of instruction planning and implementation. A three-cycle Delphi method was used to obtain a consensus for the standard. The results show that there are 13 vital knowledge items required by ICT teachers, 45 must-have knowledge items, and one moderate item. These results are important references in formulating teacher education programs in Malaysia.
Abstract:
A video-based flare image monitoring system is developed for real-time estimation of the flare gas flow rate at the edge. Depending on the desired trade-off between speed and accuracy, either an object detection (EfficientDet Dx) or an instance segmentation (Mask R-CNN) model is used for real-time detection of flare and smoke instances in the input video stream. Organic and synthetic data are used to achieve high precision and recall (greater than 0.98) for both flare and smoke. The detected rectangular bounding boxes or polygon masks are used to estimate the flame size and predict the flare gas exit velocity or, equivalently, the flow rate. The estimated flow rate is within ±10% of a reference flow meter. The deep learning models are “edgified” to shrink their size and improve inference speed by ~3x on small-footprint edge devices. The deep learning flame-size models are combined with first-principles knowledge to estimate the flare volumetric flow.
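The bounding-box-to-flow-rate step described in the abstract can be sketched in a few lines. This is not the paper's model: the power-law flame-length relation L = k·vⁿ, its constants k and n, and the `estimate_flow_rate` helper are invented placeholders standing in for the first-principles correlation the authors combine with their detectors.

```python
import math

def estimate_flow_rate(bbox, px_per_m, stack_diameter_m, k=0.2, n=0.5):
    """Illustrative only: map a detected flame bounding box to a
    volumetric flow rate via an assumed flame-length power law."""
    x0, y0, x1, y1 = bbox
    flame_length_m = (y1 - y0) / px_per_m         # pixel height -> metres
    velocity = (flame_length_m / k) ** (1.0 / n)  # invert L = k * v**n
    area = math.pi * (stack_diameter_m / 2.0) ** 2
    return velocity * area                        # exit velocity * stack area, m^3/s
```

The same inversion works whether the flame size comes from an EfficientDet bounding box or a Mask R-CNN polygon; only the pixel-to-metre calibration and the assumed flame-length model change.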
Abstract:
When tackling the problem of live information dissemination, publish/subscribe technology plays a key role in crafting an efficient solution. Especially in safety-critical domains such as automotive embedded systems, a key factor for an efficient publish/subscribe mechanism is type safety. We introduce a solution for type-based and semantic publish/subscribe that allows creating hierarchical channels tailored to the needs of an embedded system in which physical entities communicate. Our concept builds on our dynamic adaptive middleware, the Dynamic Adaptive System Infrastructure (DAiSI), which allows component configuration at runtime. As a technical medium, we use the industrial standard Extensible Lightweight Asynchronous Protocol (Exlap). Although our implementation and example pertain to DAiSI and Exlap, our concept is introduced in an integrated framework, which allows this model to be reused in other application domains.
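The idea of hierarchical, type-based channels can be illustrated independently of DAiSI and Exlap. A minimal Python sketch, assuming the subtype relation defines the channel hierarchy (the names `TypedBus`, `SensorReading`, and `WheelSpeed` are invented for illustration, not taken from the paper):

```python
from collections import defaultdict

class TypedBus:
    """Minimal type-based publish/subscribe: a subscription to a message
    type also receives messages of all of its subtypes, so the type
    hierarchy doubles as a hierarchy of channels."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, msg_type, handler):
        self._subs[msg_type].append(handler)

    def publish(self, message):
        # Walk the method resolution order so handlers registered on a
        # base type also see every derived message.
        for cls in type(message).__mro__:
            for handler in self._subs.get(cls, []):
                handler(message)

class SensorReading:
    pass

class WheelSpeed(SensorReading):
    pass
```

Type safety here falls out of the language: a handler subscribed to `WheelSpeed` can rely on the message's interface, while a handler on `SensorReading` sits one level up the channel hierarchy and receives both.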
Abstract:
The oil and gas industry is by all means a data-driven industry, as it depends massively on information technology. According to Saudi ARAMCO, the amount of data coming from upstream alone is doubling every two years. This data arrives through a wide variety of vendor sources and is handled by different application repositories. Well data in particular is a key asset for the industry throughout the process lifetime, from early exploration to production. In practice, companies often tend to create their own well data master repositories that are poorly synchronized with each other and with other databases. This results in well data residing in siloed databases, with no commonly defined standard to work with and no mechanism to cross-validate well data quality across the various sources. Thus, maintaining a high quality level for the definitive versions of well data is a critical activity in any firm's data management strategy and remains a challenge in such a dynamically growing industry. Recently, Big Data technologies have evolved to quickly fetch and analyze large volumes of data, which can substantially improve data quality in reasonable time. In this paper, a novel system is presented that attempts to preserve a high level of well data quality in a heterogeneous environment. The system utilizes Apache Spark as the main framework for distributed processing and TIBCO OpenSpirit as an integration mid-tier software layer. Through a set of defined mapping rules, the system compares data from multiple databases against one database that is conventionally known to host the organizational golden data. It is typical for oil and gas companies to dedicate one master database containing the corporate standard well data, so this database is used as the source of comparison against well data residing in project repositories. Moreover, the system extends in functionality to cover well sub-data types such as headers, check shots, deviation surveys, and picks. The final output is a data quality report that helps in making strategic decisions.
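The mapping-rule comparison at the heart of such a system can be sketched without Spark or OpenSpirit. A plain-Python illustration, assuming in-memory dictionaries stand in for the golden and project repositories (the field names and the `compare_to_golden` helper are invented for illustration, not taken from the paper):

```python
def compare_to_golden(golden, project, mapping, tol=1e-6):
    """Cross-validate project-repository well records against the corporate
    golden records through mapping rules (golden field -> project field).
    Returns a quality report: (well_id, field, golden_value, project_value)
    for each mismatch, or (well_id, "missing", None, None) for absent wells."""
    report = []
    for well_id, gold_rec in golden.items():
        proj_rec = project.get(well_id)
        if proj_rec is None:
            report.append((well_id, "missing", None, None))
            continue
        for gold_field, proj_field in mapping.items():
            g, p = gold_rec.get(gold_field), proj_rec.get(proj_field)
            if isinstance(g, float) and isinstance(p, float):
                if abs(g - p) > tol:          # numeric fields: tolerance compare
                    report.append((well_id, gold_field, g, p))
            elif g != p:                      # everything else: exact compare
                report.append((well_id, gold_field, g, p))
    return report
```

In the Spark setting the same rules would drive a distributed join keyed on well identifier, with the report materialized as a DataFrame; the per-field logic stays the same.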